Multi-Level Cache Resizing

Authors

  • Inseok Choi
  • Donald Yeung
Abstract

Hardware designers are constantly looking for ways to squeeze waste out of architectures to achieve better power efficiency. Cache resizing is a technique that can remove wasteful power consumption in caches. The idea is to determine the minimum cache a program needs to run at near-peak performance, and then reconfigure the cache to implement this efficient capacity. While there has been significant previous work on cache resizing, existing techniques have focused on controlling resizing for a single level of cache only. This sacrifices significant opportunities for power savings in modern CPU hierarchies which routinely employ 3 levels of cache. This paper investigates multi-level cache resizing (MCR). MCR independently resizes all caches in a modern cache hierarchy to minimize dynamic and static power consumption at all caching levels simultaneously. Specifically, we study a static-optimal version of MCR, and find resizing a 3-level hierarchy can reduce total energy dissipation by 58.9% with only 4.4% degradation in performance. Our study shows a non-trivial portion of this gain, one-third for programs exhibiting good temporal locality, comes from optimizing the interactions between resizing decisions at different caching levels. We also propose several dynamic resizing algorithms that can automatically find good size configurations at runtime. Our results show dynamic MCR can achieve between 40–62% energy savings with slightly higher performance degradation than static-optimal MCR.
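
As a concrete picture of what a static-optimal search over a multi-level hierarchy involves, the sketch below brute-forces every combination of L1/L2/L3 capacities and keeps the lowest-energy configuration whose estimated slowdown stays within a budget. All candidate sizes, miss-rate curves, and energy/latency constants are assumptions invented for illustration; this is not the paper's methodology or cost model.

    # Sketch of a brute-force "static-optimal" search over a 3-level cache
    # hierarchy. All sizes, miss-rate curves, and energy/latency constants
    # are invented for illustration.
    from itertools import product

    L1_SIZES = [8, 16, 32]           # KB, assumed candidate capacities
    L2_SIZES = [64, 128, 256]        # KB
    L3_SIZES = [1024, 2048, 4096]    # KB

    def miss_rate(size_kb, working_set_kb):
        # Assumed locality curve: misses fall as capacity approaches the
        # working set, then flatten at a small residual rate.
        if size_kb >= working_set_kb:
            return 0.02
        return min(1.0, 0.5 * working_set_kb / size_kb)

    def evaluate(l1, l2, l3, working_set_kb=512, accesses=1_000_000):
        # Each level only sees the misses of the level above it, which is
        # why resizing decisions at different levels interact.
        l2_refs = accesses * miss_rate(l1, working_set_kb)
        l3_refs = l2_refs * miss_rate(l2, working_set_kb)
        mem_refs = l3_refs * miss_rate(l3, working_set_kb)
        dynamic = 0.1 * accesses + 0.3 * l2_refs + 1.0 * l3_refs + 10.0 * mem_refs
        static = 50.0 * (l1 + l2 + l3)          # leakage grows with capacity
        cycles = accesses + 10 * l2_refs + 30 * l3_refs + 200 * mem_refs
        return dynamic + static, cycles

    def static_optimal(max_slowdown=0.05):
        _, base_cycles = evaluate(max(L1_SIZES), max(L2_SIZES), max(L3_SIZES))
        best = None
        for l1, l2, l3 in product(L1_SIZES, L2_SIZES, L3_SIZES):
            energy, cycles = evaluate(l1, l2, l3)
            if cycles <= base_cycles * (1 + max_slowdown):
                if best is None or energy < best[0]:
                    best = (energy, (l1, l2, l3))
        return best

    print(static_optimal())    # -> (energy, (l1_kb, l2_kb, l3_kb))

The only point of the sketch is that the levels interact: shrinking L2 changes the traffic L3 sees, so the caches cannot be sized in isolation.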

Related Papers

Symbiotic Cache Resizing for CMPs with Shared LLC

This paper investigates the problem of finding the optimal sizes of private caches and a shared LLC in CMPs. Resizing private and shared caches in modern CMPs is one way to squeeze wasteful power consumption out of architectures to improve power efficiency. However, shrinking each private/shared cache has different impact on the performance loss and the power savings to the CMPs because each ca...
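
One way to picture why each private or shared cache has a different impact is a greedy allocator that repeatedly shrinks whichever cache currently costs the least performance per unit of power saved. The per-cache coefficients, step size, and slowdown budget below are made-up numbers for illustration, not the method proposed in that paper.

    # Greedy sketch: repeatedly shrink whichever cache (private or shared)
    # buys the most power savings per unit of performance lost. All
    # coefficients and limits below are made up for illustration.
    caches = {
        "core0_L2":   {"size": 256,  "min": 64,   "power_per_kb": 0.4, "slow_per_kb": 0.0002},
        "core1_L2":   {"size": 256,  "min": 64,   "power_per_kb": 0.4, "slow_per_kb": 0.00005},
        "shared_LLC": {"size": 4096, "min": 1024, "power_per_kb": 0.1, "slow_per_kb": 0.00002},
    }
    budget = 0.05    # total slowdown the CMP is willing to accept
    STEP = 64        # KB removed per move

    while True:
        candidates = [
            (c["slow_per_kb"] / c["power_per_kb"], name)    # perf cost per watt saved
            for name, c in caches.items()
            if c["size"] - STEP >= c["min"] and c["slow_per_kb"] * STEP <= budget
        ]
        if not candidates:
            break
        _, pick = min(candidates)
        caches[pick]["size"] -= STEP
        budget -= caches[pick]["slow_per_kb"] * STEP

    print({name: c["size"] for name, c in caches.items()})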

A Unified DVFS-Cache Resizing Framework

Cache resizing and DVFS are two well-known techniques, employed to reduce leakage and dynamic power consumption respectively. Although extensively studied, these techniques have not been explored in combination. In this work we argue that optimal frequency and cache size are highly affected by each other, therefore should be studied together. We present a framework that drives DVFS and Cache Re...
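
The coupling between frequency and cache size can be illustrated with a toy joint search over (frequency, voltage) states and cache sizes: core time scales with frequency while memory stall time depends mainly on cache size, so neither knob can be set optimally alone. The DVFS table, deadline, and cost models below are illustrative assumptions, not the framework described in that work.

    # Toy joint DVFS + cache-size search; all numbers are assumed.
    DVFS_STATES = [(1.0, 0.8), (1.5, 0.9), (2.0, 1.0)]   # (GHz, Volts)
    CACHE_SIZES = [256, 512, 1024]                        # KB

    def runtime_s(freq_ghz, cache_kb, work_cycles=2e9):
        # Core work scales with frequency; memory stall time depends mainly
        # on cache size, so the two knobs interact.
        stalls = 0.2 * (1024 / cache_kb)
        return work_cycles / (freq_ghz * 1e9) + stalls

    def energy_j(freq_ghz, volts, cache_kb, work_cycles=2e9):
        dynamic = 1e-9 * work_cycles * volts ** 2         # ~C*V^2 per cycle
        leakage = 1e-4 * cache_kb * runtime_s(freq_ghz, cache_kb)
        return dynamic + leakage

    DEADLINE_S = 2.5                                      # assumed constraint
    best = min(
        ((energy_j(f, v, c), f, c)
         for (f, v) in DVFS_STATES for c in CACHE_SIZES
         if runtime_s(f, c) <= DEADLINE_S),
        default=None,
    )
    print(best)    # (energy_J, GHz, cache_KB), or None if no state meets the deadline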

Enabling Efficient Dynamic Resizing of Large DRAM Caches via A Hardware Consistent Hashing Mechanism

Die-stacked DRAM has been proposed for use as a large, high-bandwidth, last-level cache with hundreds or thousands of megabytes of capacity. Not all workloads (or phases) can productively utilize this much cache space, however. Unfortunately, the unused (or under-used) cache continues to consume power due to leakage in the peripheral circuitry and periodic DRAM refresh. Dynamically adjusting th...
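
Consistent hashing is a standard way to let capacity change while remapping only a small fraction of blocks. The software sketch below illustrates that property only; the paper proposes a hardware mechanism, which this does not reproduce, and the ring layout, SHA-1 hash, and virtual-node count are illustrative choices.

    import bisect
    import hashlib

    class ConsistentHashRing:
        """Software illustration of consistent hashing: a block address maps
        to the first cache section clockwise on a hash ring, so removing a
        section only remaps the blocks that lived in it."""

        def __init__(self, sections, vnodes=64):
            self.vnodes = vnodes
            self.ring = []                       # sorted (hash, section) points
            for s in sections:
                self.add_section(s)

        def _hash(self, key):
            return int(hashlib.sha1(str(key).encode()).hexdigest(), 16)

        def add_section(self, section):
            for i in range(self.vnodes):
                bisect.insort(self.ring, (self._hash((section, i)), section))

        def remove_section(self, section):
            self.ring = [p for p in self.ring if p[1] != section]

        def lookup(self, block_addr):
            idx = bisect.bisect_left(self.ring, (self._hash(block_addr),))
            if idx == len(self.ring):
                idx = 0                          # wrap around the ring
            return self.ring[idx][1]

    ring = ConsistentHashRing(sections=list(range(8)))
    before = {b: ring.lookup(b) for b in range(1000)}
    ring.remove_section(7)                       # "power down" one section
    moved = sum(before[b] != ring.lookup(b) for b in range(1000))
    print(f"{moved} of 1000 blocks remapped")    # roughly 1/8 with 8 sections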

Cache-Aware Utilization Control for Energy-Efficient Multi-Core Real-Time Systems

Multi-core processors are anticipated to become a major development platform for real-time systems. However, existing power management algorithms are not designed to sufficiently utilize the features available in many multi-core processors, such as shared L2 caches and per-core DVFS, to effectively minimize processor energy consumption while providing real-time guarantees. In this paper, we pro...

Exploiting Choice in Resizable Cache Design to Optimize Deep-Submicron Processor Energy-Delay

Cache memories account for a significant fraction of a chip’s overall energy dissipation. Recent research advocates using “resizable” caches to exploit cache requirement variability in applications to reduce cache size and eliminate energy dissipation in the cache’s unused sections with minimal impact on performance. Current proposals for resizable caches fundamentally vary in two design aspect...
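
For reference, the energy-delay product such designs optimize is simply E·D; a trivial numeric comparison with assumed values shows how a resized cache can win on EDP despite a small slowdown.

    # Trivial energy-delay product (EDP) comparison with assumed numbers.
    def edp(energy_j, delay_s):
        return energy_j * delay_s

    full    = edp(energy_j=2.0, delay_s=1.00)   # all cache sections powered
    resized = edp(energy_j=1.2, delay_s=1.03)   # unused sections disabled,
                                                # slightly more misses
    print(full, resized, resized < full)        # resizing wins on EDP here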

Journal:

Volume  Issue

Pages  -

Publication date: 2012